International Journal of Reliable Information and Assurance
Volume 2, No. 2, 2014, pp. 13-24

On the Significance of Non-Extensivity for the Dynamic Growth of Hidden-Layer Neurons

Abstract
In this paper we extend our work on the dynamic growth of hidden-layer neurons using the weighted sum of non-extensive entropies in [25] by investigating the significance of the non-extensivity of the entropy measure used. The dynamic neural network in the original work grows the number of hidden-layer neurons when the entropy of the weights increases during training. We compare the performance of the non-extensive entropy with Gaussian information gain proposed by Susan and Hanmandlu, used in the original paper, with that of the non-extensive entropies of Pal and Pal and of Tsallis, and also with the extensive Shannon entropy. Experiments on benchmark datasets from the UCI repository further validate the correctness of our approach and establish that the non-linearity of the non-extensive Susan and Hanmandlu entropy makes it best suited for this purpose.
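To make the comparison concrete, the sketch below computes the extensive Shannon entropy alongside two of the non-extensive measures named above (Tsallis, and the exponential entropy of Pal and Pal) for a probability distribution derived from hidden-layer weights, together with a simple entropy-increase trigger of the kind the abstract describes. This is an illustrative sketch, not the authors' implementation: the example distribution `p`, the `q` value, and the `should_grow` threshold rule are all assumptions for demonstration.

```python
import math

def shannon(p):
    """Extensive Shannon entropy: -sum p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q=2.0):
    """Non-extensive Tsallis entropy: (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def pal_pal(p):
    """Non-extensive exponential entropy of Pal and Pal: sum p_i * e^(1 - p_i)."""
    return sum(pi * math.exp(1.0 - pi) for pi in p)

def should_grow(entropy_prev, entropy_curr, tol=1e-6):
    """Hypothetical growth trigger: add a hidden neuron when the
    entropy of the weights has increased since the previous epoch."""
    return entropy_curr > entropy_prev + tol

# Illustrative distribution, e.g. normalized absolute weight magnitudes
# of one hidden-layer neuron (values chosen for demonstration only).
p = [0.1, 0.2, 0.3, 0.4]

print("Shannon :", shannon(p))
print("Tsallis :", tsallis(p, q=2.0))
print("Pal-Pal :", pal_pal(p))
print("Grow?   :", should_grow(1.20, 1.25))
```

Note that the Shannon entropy is additive over independent subsystems, while the Tsallis and Pal-Pal measures are not; that departure from additivity is the non-extensivity whose significance the paper investigates.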